76 research outputs found

    Designing gamified interactive systems for empathy development

    [EN] A lack of empathy contributes to the development of insensitive social attitudes, so the design of interactive systems based on games or playful experiences for developing empathic skills is of vital importance in education. In this paper we propose a circular, iterative empathy development model and analyze gamification strategies that can inform the design of interactive systems that foster empathy through pedagogical strategies in which users are exposed to affective, cognitive, reflective and social experiences that encourage prosocial behavior. This work is funded by the European Regional Development Fund (ERDF-FEDER) and supported by the Spanish MINECO (project 2GETHER PID2019-108915RB-I00). López-Faicán, L.; Jaén Martínez, FJ. (2021). Designing gamified interactive systems for empathy development. Association for Computing Machinery. 27-29. https://doi.org/10.1145/3468002.3468236

    Interactive spaces for children: gesture elicitation for controlling ground mini-robots

    [EN] Interactive spaces for education are emerging as a mechanism for fostering children's natural ways of learning through play and exploration in physical spaces. The advanced interactive modalities and devices for such environments need to be both motivating and intuitive for children. Among the wide variety of interactive mechanisms, robots have been a popular research topic in the context of educational tools because of their attractiveness to children. However, few studies have focused on how children would naturally interact with robots and explore interactive environments with them. While there is abundant research on full-body interaction and intuitive manipulation of robots by adults, no similar research has been done with children. This paper therefore describes a gesture elicitation study that identified the preferred gestures and body language used by children to control ground robots. The results of the elicitation study were used to define a gestural language that covers the gesture preferences of each age group and gender, with a good acceptance rate in the 6-12 age range. The study also revealed that interactive spaces in which robots are controlled through body gestures are motivating and promising scenarios for collaborative or remote learning activities. This work is funded by the European Regional Development Fund (ERDF-FEDER) and supported by the Spanish MINECO (TIN2014-60077-R). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks are due to the children and teachers of the Col·legi Públic Vicente Gaos for their valuable collaboration and dedication. Pons Tomás, P.; Jaén Martínez, FJ. (2020). Interactive spaces for children: gesture elicitation for controlling ground mini-robots. Journal of Ambient Intelligence and Humanized Computing. 11(6):2467-2488.
https://doi.org/10.1007/s12652-019-01290-6
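    Gesture elicitation studies like the one above typically quantify consensus with an agreement rate computed over the gestures participants propose for each referent. Below is a minimal Python sketch of the standard agreement-rate measure; the function name and the example data are illustrative and not taken from the paper.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent:
    AR = sum_i |P_i|(|P_i|-1) / (|P|(|P|-1)),
    where the P_i are groups of identical gesture proposals
    among the |P| participants."""
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)  # group identical proposals
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Hypothetical example: 20 children propose gestures for "move forward".
proposals = ["push"] * 12 + ["point"] * 5 + ["wave"] * 3
print(round(agreement_rate(proposals), 3))  # prints 0.416
```

    In the example, 12 of 20 identical proposals yield an agreement rate of about 0.42; scores closer to 1 indicate stronger consensus on a gesture for that referent.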

    An ACO-based personalized learning technique in support of people with acquired brain injury

    This is the author’s version of a work that was accepted for publication in Applied Soft Computing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Applied Soft Computing 47 (2016) 316–331. DOI 10.1016/j.asoc.2016.04.039. The ever-increasing cases of acquired brain injury (ABI), especially among young people, have prompted rapid progress in research involving neurological disorders. One important path is the concept of relearning, which attempts to help people regain basic motor and cognitive skills lost due to illness or accident. The goals of relearning are twofold. First, there must be a way to properly assess the needs of an affected person, leading to a diagnosis followed by a recommendation regarding the exercises, tests and tasks to perform; and second, there must be a way to confirm the results obtained from these recommendations in order to fine-tune and personalize the relearning process. This presents a challenge, as there is a deeply rooted duality between the personalized and the generalized approach. In this work we propose a personalization algorithm based on ant colony optimization (ACO), a bio-inspired meta-heuristic. As we show, the stochastic nature of ants has certain similarities to the human learning process. We combine the adaptive and exploratory capabilities of ACO systems to respond to rapidly changing environments and the ubiquitous human factor. Finally, we test the proposed solution extensively in various scenarios, achieving high-quality results. © 2016 Elsevier B.V.
All rights reserved. This research has been funded by the Spanish Ministry of Economy and Competitiveness and by the FEDER funds of the EU under the projects SUPEREMOS (TIN2014-60077-R) and insPIre (TIN2012-34003). Kamil Krynicki is supported by an FPI fellowship from the Universitat Politècnica de València. Krynicki, K.; Jaén Martínez, FJ.; Navarro, E. (2016). An ACO-based personalized learning technique in support of people with acquired brain injury. Applied Soft Computing. 47:316-331. doi:10.1016/j.asoc.2016.04.039
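    The core ACO idea the abstract describes, reinforcing recommendations that worked for a user while letting unused ones decay so the system can re-adapt, can be illustrated with a generic skeleton. This is a hedged sketch, not the paper's actual algorithm; `aco_recommend`, the `score` callback, and all parameter values are illustrative assumptions.

```python
import random

def aco_recommend(n_items, score, n_ants=20, n_iter=50,
                  evaporation=0.1, seed=0):
    """Generic ant colony optimization skeleton: ants stochastically pick
    items in proportion to pheromone, good picks are reinforced, and
    evaporation forgets old evidence so the system stays adaptive when
    the environment (here, the user's performance) changes."""
    rng = random.Random(seed)
    pheromone = [1.0] * n_items
    for _ in range(n_iter):
        for _ in range(n_ants):
            # Roulette-wheel selection proportional to pheromone.
            total = sum(pheromone)
            r, acc, choice = rng.uniform(0, total), 0.0, n_items - 1
            for i, p in enumerate(pheromone):
                acc += p
                if r <= acc:
                    choice = i
                    break
            # Reinforce in proportion to how well the item suited the user.
            pheromone[choice] += score(choice)
        # Evaporation: decay all trails, enabling re-adaptation.
        pheromone = [(1 - evaporation) * p for p in pheromone]
    return max(range(n_items), key=pheromone.__getitem__)

# Toy usage: item 2 is the best fit for this simulated user.
best = aco_recommend(5, score=lambda i: 1.0 if i == 2 else 0.1)
print(best)
```

    The evaporation step is what distinguishes this from a plain greedy recommender: it bounds how much old feedback can dominate, which matches the abstract's emphasis on responding to rapidly changing environments.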

    Evaluating Simultaneous Visual Instructions with Kindergarten Children on Touchscreen Devices

    [EN] A myriad of educational applications using tablets and multi-touch technology for kindergarten children have been developed in the last decade. However, despite the possible benefits of using visual prompts to communicate information to kindergarteners, these visual techniques have not yet been fully studied. This article therefore investigates kindergarten children's abilities to understand and follow several visual prompts about how to proceed and interact in a virtual 2D world. The results show that kindergarteners are able to effectively understand several visual prompts with different communication purposes despite their being used simultaneously. The results also show that using the evaluated visual prompts to communicate information during play reduces the number of interruptions of a technical nature, fostering dialogues related to the learning activity guided by instructors or caregivers. Hence, this work is a starting point for designing dialogic learning scenarios tailored to kindergarten children. This work is supported by the Spanish Ministry of Economy and Competitiveness and funded by the European Regional Development Fund (ERDF-FEDER) with Project TIN2014-60077-R; by the VALi+d program from the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) under fellowship ACIF/2014/214; and by the FPU program from the Spanish Ministry of Education, Culture, and Sport under fellowship FPU14/00136. Nácher, V.; García-Sanjuan, F.; Jaén Martínez, FJ. (2020). Evaluating Simultaneous Visual Instructions with Kindergarten Children on Touchscreen Devices. International Journal of Human-Computer Interaction. 36(1):41-54. https://doi.org/10.1080/10447318.2019.1597576

    Envisioning Future Playful Interactive Environments for Animals

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-981-287-546-4_6. Play stands as one of the most natural and inherent behaviors among the majority of living species, particularly humans and animals. Human play has evolved significantly over the years, and so have the artifacts that allow us to play: from children playing tag with no tools other than their bodies, to modern video games using haptic and wearable devices to augment the playful experience. However, this ludic revolution has not been the same for humans’ closest companions, our pets. Recently, a new discipline inside the human–computer interaction (HCI) community, called animal–computer interaction (ACI), has focused its attention on improving animals’ welfare using technology. Several works in the ACI field rely on playful interfaces to mediate this digital communication between animals and humans. Until now, the development of these interfaces has addressed only a single goal or activity, and their adaptation to the animals’ needs requires the developers’ intervention. This work analyzes the existing approaches and proposes a more generic and autonomous system aimed at addressing several aspects of animal welfare at a time: Intelligent Playful Environments for Animals. The great potential of these systems is discussed, explaining how incorporating intelligent capabilities within playful environments could allow learning from the animals’ behavior and automatically adapting the game to the animals’ needs and preferences. The engaging playful activities created with these systems could serve different purposes and eventually improve animals’ quality of life. This work was partially funded by the Spanish Ministry of Science and Innovation under the National R&D&I Program within the projects Create Worlds (TIN2010-20488) and SUPEREMOS (TIN2014-60077-R), and from the Universitat Politècnica de València under Project UPV-FE-2014-24.
It also received support from a postdoctoral fellowship within theVALi+d Program of the Conselleria d’Educació, Cultura I Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons has been supported by the Universitat Politècnica de València under the “Beca de Excelencia” program and currently by an FPU fellowship from the Spanish Ministry of Education, Culture, and Sports (FPU13/03831).Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Envisioning Future Playful Interactive Environments for Animals. En More Playful User Interfaces: Interfaces that Invite Social and Physical Interaction. Springer. 121-150. https://doi.org/10.1007/978-981-287-546-4_6S121150Alfrink, K., van Peer, I., Lagerweij H, et al.: Pig Chase. Playing with Pigs project. (2012) www.playingwithpigs.nlAmat, M., Camps, T., Le, Brech S., Manteca, X.: Separation anxiety in dogs: the implications of predictability and contextual fear for behavioural treatment. Anim. Welf. 23(3), 263–266 (2014). doi: 10.7120/09627286.23.3.263Barker, S.B., Dawson, K.S.: The effects of animal-assisted therapy on anxiety ratings of hospitalized psychiatric patients. Psychiatr. Serv. 49(6), 797–801 (1998)Bateson, P., Martin, P.: Play, Playfulness, Creativity and Innovation. Cambridge University Press, New York (2013)Bekoff, M., Allen, C.: Intentional communication and social play: how and why animals negotiate and agree to play. In: Bekoff, M., Byers, J.A. (eds.) Animal Play Evolutionary. Comparative and Ecological Perspectives, pp. 97–114. Cambridge University Press, New York (1997)Burghardt, G.M.: The Genesis of Animal Play. Testing the Limits. MIT Press, Cambridge (2006)Catalá, A., Pons, P., Jaén, J., et al.: A meta-model for dataflow-based rules in smart environments: evaluating user comprehension and performance. Sci. Comput. Prog. 78(10), 1930–1950 (2013). 
doi: 10.1016/j.scico.2012.06.010Cheok, A.D., Tan, R.T.K.C., Peiris, R.L., et al.: Metazoa ludens: mixed-reality interaction and play for small pets and humans. IEEE Trans. Syst. Man. Cybern.—Part A Syst. Hum. 41(5), 876–891 (2011). doi: 10.1109/TSMCA.2011.2108998Costello, B., Edmonds, E.: A study in play, pleasure and interaction design. In: Proceedings of the 2007 Conference on Designing Pleasurable Products and Interfaces, pp. 76–91 (2007)Csikszentmihalyi, M.: Beyond Boredom and Anxiety. The Experience of Play in Work and Games. Jossey-Bass Publishers, Hoboken (1975)Filan, S.L., Llewellyn-Jones, R.H.: Animal-assisted therapy for dementia: a review of the literature. Int. Psychogeriatr. 18(4), 597–611 (2006). doi: 10.1017/S1041610206003322García-Herranz, M., Haya, P.A., Alamán, X.: Towards a ubiquitous end-user programming system for smart spaces. J. Univ. Comput. Sci. 16(12), 1633–1649 (2010). doi: 10.3217/jucs-016-12-1633Hirskyj-Douglas, I., Read, J.C.: Who is really in the centre of dog computer interaction? In: Adjunct Proceedings of the 11th Conference on Advances in Computer Entertainment—Workshop on Animal Human Computer Interaction (2014)Hu, F., Silver, D., Trude, A.: LonelyDog@Home. In: International Conference Web Intelligence Intelligent Agent Technology—Workshops, 2007 IEEE/WIC/ACM IEEE, pp. 333–337, (2007)Huizinga, J.: Homo Ludens. Wolters-Noordhoff, Groningen (1985)Kamioka, H., Okada, S., Tsutani, K., et al.: Effectiveness of animal-assisted therapy: a systematic review of randomized controlled trials. Complement. Ther. Med. 22(2), 371–390 (2014). doi: 10.1016/j.ctim.2013.12.016Lee, S.P., Cheok, A.D., James, T.K.S., et al.: A mobile pet wearable computer and mixed reality system for human–poultry interaction through the internet. Pers. Ubiquit. Comput. 10(5), 301–317 (2006). doi: 10.1007/s00779-005-0051-6Leo, K., Tan, B.: User-tracking mobile floor projection virtual reality game system for paediatric gait and dynamic balance training. 
In: Proceedings of the 4th International Convention on Rehabilitation Engineering and Assistive Technology pp. 25:1–25:4 (2010)Mancini, C.: Animal-computer interaction: a manifesto. Mag. Interact. 18(4), 69–73 (2011). doi: 10.1145/1978822.1978836Mancini, C.: Animal-computer interaction (ACI): changing perspective on HCI, participation and sustainability. CHI ’13 Extended Abstracts on Human Factors in Computing Systems. ACM Press, New York, pp. 2227–2236 (2013)Mancini, C., van der Linden, J.: UbiComp for animal welfare: envisioning smart environments for kenneled dogs. In: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, pp. 117–128 (2014)Mancini, C., Harris, R., Aengenheister, B., Guest, C.: Re-centering multispecies practices: a canine interface for cancer detection dogs. In: Proceedings of the SIGCHI Conference on Human Factors in Computing System, pp. 2673–2682 (2015)Mancini, C., van der Linden, J., Bryan, J., Stuart, A.: Exploring interspecies sensemaking: dog tracking semiotics and multispecies ethnography. In: Proceedings of the 2012 ACM Conference on Ubiquitous Computing—UbiComp ’12. ACM Press, New York, pp. 143–152 (2012)Mankoff, D., Dey, A.K., Mankoff, J., Mankoff, K.: Supporting interspecies social awareness: using peripheral displays for distributed pack awareness. In: Proceedings of the 18th Annual ACM Symposium on User interface Software and Technology, pp. 253–258 (2005)Maternaghan, C., Turner, K.J.: A configurable telecare system. In: Proceedings of the 4th International Conference on Pervasive Technologies Related to Assistive Environments—PETRA ’11. ACM Press, New York, pp. 14:1–14:8 (2011)Matsuzawa, T.: The Ai project: historical and ecological contexts. Anim. Cogn. 6(4), 199–211 (2003). doi: 10.1007/s10071-003-0199-2McGrath, R.E.: Species-appropriate computer mediated interaction. CHI ‘09 Extended Abstracts on Human Factors in Computing Systems. ACM Press, New York, pp. 
2529–2534 (2009)Mocholí, J.A., Jaén, J., Catalá, A.: A model of affective entities for effective learning environments. In: Innovations in Hybrid Intelligent Systems, pp. 337–344 (2007)Nijholt, A. (ed.): Playful User Interfaces. Springer, Singapore (2014)Norman, D.A.: The invisible computer. MIT Press, Cambridge (1998)Noz, F., An, J.: Cat cat revolution: an interspecies gaming experience. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2661–2664 (2011)Paldanius, M., Kärkkäinen, T., Väänänen-Vainio-Mattila, K., et al.: Communication technology for human-dog interaction: exploration of dog owners’ experiences and expectations. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM Press, New York, pp. 2641–2650 (2011)Picard, R.W.: Affective Computing. MIT Press, Cambridge (1997)Pons, P., Jaén, J., Catalá, A.: Animal ludens: building intelligent playful environments for animals. In: Adjunct Proceedings of the 11th Conference on Advances in Computer Entertainment—Workshop on Animal Human Computer Interaction (2014)Resner, B.: Rover@Home: Computer Mediated Remote Interaction Between Humans and Dogs. M.Sc. thesis, Massachusetts Institute of Technology, Cambridge (2001)Ritvo, S.E., Allison, R.S.: Challenges related to nonhuman animal-computer interaction: usability and “liking”. In: Adjunct Proceedings of the 11th Conference on Advances in Computer Entertainment—Workshop on Animal Human Computer Interaction (2014)Robinson, C., Mncini, C., Van Der Linden, J., et al.: Canine-centered interface design: supporting the work of diabetes alert dogs. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 3757–3766 (2014)Rumbaugh, D.M.: Language Learning by a Chimpanzee: The LANA Project. Academic Press, New York (1977)Rumbaugh, D.M.: Apes and their future in comparative psychology. 
Eye Psi Chi 18(1), 16–19 (2013)Rumbaugh, D.M., Gill, T.V., Brown, J.V., et al.: A computer-controlled language training system for investigating the language skills of young apes. Behav. Res. Methods Instrum. 5(5), 385–392 (1973)Schwartz, S.: Separation anxiety syndrome in cats: 136 cases (1991–2000). J. Am. Vet. Med. Assoc. 220(7), 1028–1033 (2002). doi: 10.2460/javma.2002.220.1028Schwartz, S.: Separation anxiety syndrome in dogs and cats. J. Am. Vet. Med. Assoc. 222(11), 1526–1532 (2003)Solomon, O.: What a dog can do: children with autism and therapy dogs in social interaction. Ethos J. Soc. Psychol. Anthropol. 38(1), 143–166 (2010). doi: 10.1111/j.1548-1352.2010.01085.xTeh, K.S., Lee, S.P., Cheok, A.D.: Poultry. Internet: a remote human-pet interaction system. In: CHI ’06 Extended Abstracts on Human Factors in Computing Systems, pp. 251–254 (2006)Väätäjä, H., Pesonen, E.: Ethical issues and guidelines when conducting HCI studies with animals. In: CHI ’13 Extended Abstracts on Human Factors in Computing Systems, pp. 2159–2168 (2013)Väätäjä, H.: Animal welfare as a design goal in technology mediated human-animal interaction—opportunities with haptics. In: Adjunct Proceedings of the 11th Conference on Advances in Computer Entertainment—Workshop on Animal Human Computer Interaction (2014)Weilenmann, A., Juhlin, O.: Understanding people and animals. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems—CHI ’11. ACM Press, New York, pp. 2631–2640 (2011)Weiser, M.: The computer for the 21st century. Sci. Am. 265(3), 94–104 (1991)Westerlaken, M., Gualeni, S., Geurtsen, A.: Grounded zoomorphism: an evaluation methodology for ACI design. In: Adjunct Proceedings of the 11th Conference on Advances in Computer Entertainment—Workshop on Animal Human Computer Interaction (2014)Westerlaken, M., Gualeni, S.: Felino: the philosophical practice of making an interspecies videogame. Philosophy of Computer Games Conference, pp. 
1–12 (2014)Wingrave, C.A., Rose, J., Langston, T., LaViola, J.J.J.: Early explorations of CAT: canine amusement and training. In: CHI ’10 Extended Abstracts on Human Factors in Computing Systems, pp. 2661–2669 (2010

    Examining the Usability of Touch Screen Gestures for Children With Down Syndrome

    Full text link
    [EN] The use of multi-touch devices by all types of users (from children to the elderly) has grown considerably in recent years. However, despite the huge interest in this technology, there is a lack of research addressing usability studies on children with Down syndrome. This article evaluates the abilities of these children (aged 5 to 10 years) when performing a basic set of multi-touch gestures (tap, double tap, long press, drag, scale up and down, rotation) on tablet devices. The results show that, despite their more limited motor skills, DS children are able to perform most of the evaluated multi-touch gestures with success rates close to 100%, and that this technology could be fully exploited for developing applications targeted specifically at this type of user. This work was supported by the Spanish Ministry of Economy and Competitiveness and funded by the European Development Regional Fund (EDRF-FEDER) with the project TIN2014-60077-R (SUPEREMOS). This work was also supported by a pre-doctoral fellowship within the Formación de Profesorado Universitario (FPU) program from the Spanish Ministry of Education, Culture and Sports to V. Nacher (FPU14/00136) and by a pre-doctoral scholarship given by the SENESCYT (Secretaría Nacional de Educación Superior, Ciencia y Tecnología e Innovación) of the government of Ecuador (No. 381-2012) to Doris Caliz. Nácher-Soler, VE.; Cáliz, D.; Jaén Martínez, FJ.; Martínez, L. (2018). Examining the Usability of Touch Screen Gestures for Children With Down Syndrome. Interacting with Computers. 30(3):258-272. https://doi.org/10.1093/iwc/iwy011
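    Per-gesture success rates of the kind reported in this study are simple aggregations over trial logs. The sketch below shows one way to compute them; the log format (a list of gesture/outcome pairs) is an assumption for illustration, not the study's actual instrumentation.

    ```python
    def success_rates(trials):
        """Compute the fraction of successful attempts per gesture.

        `trials` is a list of (gesture, succeeded) pairs, e.g. one entry
        per attempt logged during a usability session (hypothetical format).
        Returns a dict mapping each gesture to its success rate in [0, 1].
        """
        totals, wins = {}, {}
        for gesture, ok in trials:
            totals[gesture] = totals.get(gesture, 0) + 1
            wins[gesture] = wins.get(gesture, 0) + (1 if ok else 0)
        return {g: wins[g] / totals[g] for g in totals}

    # Example: two tap attempts (both succeed), two drag attempts (one fails)
    log = [("tap", True), ("tap", True), ("drag", True), ("drag", False)]
    rates = success_rates(log)  # {"tap": 1.0, "drag": 0.5}
    ```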

    A diffusion-based ACO resource discovery framework for dynamic p2p networks

    Full text link
    © 2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The Ant Colony Optimization (ACO) metaheuristic has proved very resourceful over the past decade and has been successfully used to approximately solve many static NP-hard problems. There is, however, a limit to its applicability in the field of p2p networks, which stems from the fact that such networks can evolve constantly and at a high pace, rendering already-established results useless. In this paper we approach the problem by proposing a generic knowledge diffusion mechanism that extends the classical ACO paradigm to better cope with the dynamic nature of p2p networks. Focusing initially on the appearance of new resources in the network, we show that it is possible to increase the efficiency of ant routing by a significant margin. Kamil Krynicki is supported by an FPI fellowship from the Universitat Politècnica de València with reference number 3117. This work received financial support from the Spanish Ministry of Education under the National Strategic Program of Research and Project TSI2010-20488. Krynicki, KK.; Jaén Martínez, FJ.; Catalá Bolós, A. (2013). A diffusion-based ACO resource discovery framework for dynamic p2p networks. En 2013 IEEE Congress on Evolutionary Computation. IEEE. 860-867. https://doi.org/10.1109/CEC.2013.6557658
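    The core ACO mechanism the paper builds on can be sketched in a few lines: each node keeps a pheromone level per neighbour, ants pick the next hop with probability proportional to pheromone, successful searches deposit pheromone, and periodic evaporation forgets stale routes so the colony can adapt when resources appear or disappear. This is a minimal generic sketch of pheromone routing, not the paper's diffusion mechanism; all names and parameter values are assumptions.

    ```python
    import random

    class AntRouter:
        """Pheromone-based next-hop selection at a single p2p node (sketch)."""

        def __init__(self, neighbours, evaporation=0.1, deposit=1.0):
            # Start with a uniform pheromone level so all neighbours are
            # equally likely until search feedback arrives.
            self.pheromone = {n: 1.0 for n in neighbours}
            self.evaporation = evaporation
            self.deposit = deposit

        def choose_next_hop(self, rng=random):
            """Roulette-wheel selection proportional to pheromone."""
            total = sum(self.pheromone.values())
            r = rng.uniform(0.0, total)
            acc = 0.0
            for node, tau in self.pheromone.items():
                acc += tau
                if r <= acc:
                    return node
            return node  # guard against floating-point edge cases

        def reinforce(self, node):
            """Deposit pheromone when a search through `node` succeeded."""
            self.pheromone[node] += self.deposit

        def evaporate(self):
            """Periodic decay: keeps routing responsive to network change."""
            for node in self.pheromone:
                self.pheromone[node] *= (1.0 - self.evaporation)
    ```

    The evaporation step is what the dynamic-network argument hinges on: without it, pheromone accumulated toward a resource that has since left the network would keep attracting ants indefinitely.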

    Augmented Tangible Surfaces to Support Cognitive Games for Ageing People

    Full text link
    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-19695-4_27 The continuously and rapidly increasing elderly population requires a revision of technology design in order to devise systems that are usable and meaningful for this social group. Most applications for ageing people are built to provide supporting services, taking into account the physical and cognitive abilities that decrease over time. This paper, however, focuses on building technology to improve such capacities, or at least slow down their decline, through cognitive games. This is achieved by means of a digitally augmented table-like surface that combines touch with tangible input for a more natural, intuitive, and appealing means of interaction. Its construction materials make it an affordable device likely to be used in retirement homes in the context of therapeutic activities, and its form factor enables a versatile, quick, and scalable configuration, as well as a socializing experience. This work received financial support from the Spanish Ministry of Economy and Competitiveness under the National Strategic Program of Research and Project TIN2010-20488, and from Universitat Politècnica de València under Project UPV-FE-2014-24. It is also supported by fellowships APOSTD/2013/013 and ACIF/2014/214 within the VALi+d program from Conselleria d'Educació, Cultura i Esport (GVA). García Sanjuan, F.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Augmented Tangible Surfaces to Support Cognitive Games for Ageing People. En Ambient Intelligence - Software and Applications. Springer. 263-271. doi:10.1007/978-3-319-19695-4_27

    Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking

    Full text link
    [EN] There is growing interest in the automatic detection of animals' behaviors and body postures within the field of Animal-Computer Interaction, and in the benefits this could bring to animal welfare: enabling remote communication, welfare assessment, detection of behavioral patterns, and interactive and adaptive systems. Most work on animal behavior recognition relies on wearable sensors to gather information about the animals' postures and movements, which is then processed using machine learning techniques. However, non-wearable mechanisms such as depth-based tracking can also make use of machine learning classifiers for the automatic detection of animals' behavior, and offer the advantage of working in set-ups in which wearable devices would be difficult to use. This paper presents a depth-based tracking system for the automatic detection of animals' postures and body parts, as well as an exhaustive evaluation of the performance of several classification algorithms based on both a supervised and a knowledge-based approach. The evaluation of the depth-based tracking system and the different classifiers shows that the proposed system is promising for advancing research on animal behavior recognition within and outside the field of Animal-Computer Interaction. © 2017 Elsevier Ltd. All rights reserved. This work is funded by the European Development Regional Fund (EDRF-FEDER) and supported by Spanish MINECO with Project TIN2014-60077-R. It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). The work of Patricia Pons is supported by a national grant from the Spanish MECD (FPU13/03831). Special thanks to our cat participants and their owners, and many thanks to our feline caretakers and therapists, Olga, Asier and Julia, for their valuable collaboration and their dedication to animal wellbeing. Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2017). Assessing machine learning classifiers for the detection of animals' behavior using depth-based tracking. Expert Systems with Applications. 86:235-246. https://doi.org/10.1016/j.eswa.2017.05.063
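    The supervised part of such a pipeline reduces to classifying feature vectors extracted per frame from the depth stream. A minimal sketch of that idea using a k-nearest-neighbour classifier follows; the feature names (e.g. body height, elongation) and the labels are hypothetical illustrations, and the paper evaluates several other algorithms besides nearest-neighbour methods.

    ```python
    import math
    from collections import Counter

    def knn_classify(train, query, k=3):
        """Label a posture feature vector by majority vote of its k nearest
        training examples.

        `train` is a list of (features, label) pairs, where `features` could
        be depth-derived measurements such as body height and elongation
        (hypothetical features, for illustration only).
        """
        # Sort training examples by Euclidean distance to the query.
        dists = sorted((math.dist(f, query), label) for f, label in train)
        # Majority vote among the k closest examples.
        votes = Counter(label for _, label in dists[:k])
        return votes.most_common(1)[0][0]

    # Toy example: two "lying" frames near the origin, two "standing" frames
    # farther away in feature space.
    frames = [
        ((0.0, 0.0), "lying"),
        ((0.1, 0.1), "lying"),
        ((1.0, 1.0), "standing"),
        ((0.9, 1.1), "standing"),
    ]
    label = knn_classify(frames, (0.05, 0.05))  # -> "lying"
    ```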

    Beyond the limits of digital interaction: should animals play with interactive environments?

    Full text link
    Our digital world is evolving towards ubiquitous and intuitive scenarios, filled with interconnected and transparent computing devices that ease our daily activities. We have approached this evolution of technology in a strictly human-centric manner. There are, however, plenty of species, among them our pets, which could also profit from these technological advances. A new field in Computer Science, called Animal-Computer Interaction (ACI), aims at filling this technological gap by developing systems and interfaces specifically designed for animals. This paper envisions how ACI could be extended to enhance the most natural animal behavior: play. This work explains how interactive environments could become playful scenarios where animals enjoy, learn and interact with technology, improving their wellbeing. This work is partially funded by the Spanish Ministry of Science and Innovation under the National R&D&I Program within the project CreateWorlds (TIN2010-20488). The work of Patricia Pons is supported by an FPU fellowship from the Spanish Ministry of Education, Culture and Sports (FPU13/03831). It also received support from a postdoctoral fellowship within the VALi+d Program of the Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana) awarded to Alejandro Catalá (APOSTD/2013/013). We also thank the Valencian Society for the Protection of Animals and Plants (SVPAP) for their cooperation. Pons Tomás, P.; Jaén Martínez, FJ.; Catalá Bolós, A. (2015). Beyond the limits of digital interaction: should animals play with interactive environments? ACM. http://hdl.handle.net/10251/65361